Connectionist-Inspired Incremental PCFG Parsing
Authors
Abstract
Probabilistic context-free grammars (PCFGs) are a popular cognitive model of syntax (Jurafsky, 1996). These can be formulated to be sensitive to human working memory constraints by application of a right-corner transform (Schuler, 2009). One side-effect of the transform is that it guarantees at most a single expansion (push) and at most a single reduction (pop) during a syntactic parse. The primary finding of this paper is that this property of right-corner parsing can be exploited to obtain a dramatic reduction in the number of random variables in a probabilistic sequence model parser. This yields a simpler structure that more closely resembles existing simple recurrent network models of sentence comprehension.
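The abstract's key property — at most one expansion (push) and at most one reduction (pop) per word — can be illustrated with a minimal sketch. This is not the paper's actual model; the `transition` helper and the category labels below are hypothetical, chosen only to show why the store depth changes by at most one per time step.

```python
# Minimal sketch (assumed interface, not the paper's implementation):
# after a right-corner transform, each word transition performs at most
# one reduction (pop) followed by at most one expansion (push), so the
# parser's memory store grows or shrinks by at most one element per word.

def transition(store, pop, push_label=None):
    """Apply at most one pop, then at most one push; return the new store."""
    new_store = store[:-1] if (pop and store) else list(store)
    if push_label is not None:
        new_store.append(push_label)  # single expansion
    return new_store

# Walk a toy store through three words: depth never changes by more than 1.
store = []
for pop, label in [(False, "NP/NN"), (True, "NP"), (False, "S/VP")]:
    prev_depth = len(store)
    store = transition(store, pop, label)
    assert abs(len(store) - prev_depth) <= 1
```

Because each step touches only the top of the store, a probabilistic sequence model over such transitions needs far fewer random variables than one allowing arbitrary numbers of pushes and pops per word.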
Similar resources
Parsing TCT with a Coarse-to-fine Approach
A key observation is that compound constituent labels are detrimental to parsing performance. We use a PCFG parsing algorithm that uses a multilevel coarse-to-fine scheme. Our approach requires a sequence of nested partitions or equivalence classes of the PCFG nonterminals, where the nonterminals of each PCFG are clusters of nonterminals of the finer PCFG. We use the results of parsing ...
Data-driven, PCFG-based and Pseudo-PCFG-based Models for Chinese Dependency Parsing
We present a comparative study of transition-, graph- and PCFG-based models aimed at illuminating more precisely the likely contribution of CFGs in improving Chinese dependency parsing accuracy, especially by combining heterogeneous models. Inspired by the impact of a constituency grammar on dependency parsing, we propose several strategies to acquire pseudo CFGs only from dependency annotations....
Deriving lexical and syntactic expectation-based measures for psycholinguistic modeling via incremental top-down parsing
A number of recent publications have made use of the incremental output of stochastic parsers to derive measures of high utility for psycholinguistic modeling, following the work of Hale (2001; 2003; 2006). In this paper, we present novel methods for calculating separate lexical and syntactic surprisal measures from a single incremental parser using a lexicalized PCFG. We also present an approx...
Implications of a Structured Vectorial Semantic Framework for Incremental Interpretation
Previous corpus studies have shown that the memory usage of a syntactic processing model based on a simple right-corner grammar transform (Schuler et al. 2008, 2010) corresponds to recent estimates of human short-term memory capacity (Miller 1956, Cowan 2001). This article describes a study of the memory usage of a straightforward extension of this model to incremental semantic processing, show...
Feature-Rich Log-Linear Lexical Model for Latent Variable PCFG Grammars
Context-free grammars with latent annotations (PCFG-LA) have been found to be effective for parsing many languages; however, currently their lexical model may be subject to over-fitting and requires language engineering to handle out-of-vocabulary (OOV) words. Inspired by previous studies that have incorporated rich features into generative models, we propose to use a feature-rich log-linear lex...
Publication date: 2012